Frontier Pruning for Shift-Reduce CCG Parsing
Authors
Abstract
We apply the graph-structured stack (GSS) to shift-reduce parsing in a Combinatory Categorial Grammar (CCG) parser. This allows the shift-reduce parser to explore all possible parses in polynomial time without resorting to heuristics such as beam search. The GSS-based shift-reduce parser is 34% slower than CKY in the finely-tuned C&C parser. We perform frontier pruning on the GSS, increasing the parsing speed to be competitive with the C&C parser at a small cost in accuracy.
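As an informal illustration of the data structure involved, the sketch below shows a graph-structured stack whose frontier is pruned by model score before each action. The node layout, the scoring scheme, and the names GSSNode, prune_frontier, and shift are assumptions made for exposition, not the implementation used in the parser.

# Minimal sketch of a graph-structured stack (GSS) with frontier pruning.
# All names and the scoring scheme are illustrative, not taken from the parser.
from dataclasses import dataclass, field

@dataclass
class GSSNode:
    category: str                                  # CCG category on top of this stack path
    score: float                                   # best model score of any path reaching this node
    parents: list = field(default_factory=list)    # preceding stack nodes, shared across parses

def prune_frontier(frontier, max_nodes):
    # Keep only the highest-scoring frontier nodes before the next action.
    # Because lower nodes are shared, discarding a frontier node prunes every
    # parse that can only be continued through it.
    return sorted(frontier, key=lambda n: n.score, reverse=True)[:max_nodes]

def shift(frontier, category, action_score):
    # Shifting pushes the next word's category once and shares the new node
    # among all surviving frontier paths, which is what keeps the search polynomial.
    top = GSSNode(category=category,
                  score=max(n.score for n in frontier) + action_score,
                  parents=list(frontier))
    return [top]

Pruning trades completeness for speed: any parse whose continuation depends only on a discarded frontier node is lost, which is where the small accuracy penalty reported above comes from.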
Similar papers
LSTM Shift-Reduce CCG Parsing
We describe a neural shift-reduce parsing model for CCG, factored into four unidirectional LSTMs and one bidirectional LSTM. This factorization allows the linearization of the complete parsing history, and results in a highly accurate greedy parser that outperforms all previous beam-search shift-reduce parsers for CCG. By further deriving a globally optimized model using a task-based loss, we i...
Shift-Reduce CCG Parsing
CCGs are directly compatible with binary-branching bottom-up parsing algorithms, in particular CKY and shift-reduce algorithms. While the chart-based approach has been the dominant approach for CCG, the shift-reduce method has been little explored. In this paper, we develop a shift-reduce CCG parser using a discriminative model and beam search, and compare its strengths and weaknesses with the c...
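To make the contrast with chart parsing concrete, here is a schematic beam-search shift-reduce loop. The combine function stands in for the CCG combinators and score_action for the discriminative model; both, along with the state representation, are placeholders chosen for illustration rather than details taken from the paper.

# Schematic beam-search shift-reduce parsing over a sequence of supertags.
# combine() implements only toy forward application; score_action() is a stub
# standing in for a discriminative model.
import heapq

def combine(left, right):
    # Toy forward application: X/Y + Y -> X.
    if "/" in left and left.split("/", 1)[1] == right:
        return left.split("/", 1)[0]
    return None

def score_action(stack, buffer, action):
    return 0.0    # placeholder for the discriminative model score

def parse(supertags, beam_size=8):
    # A state is (score, stack, buffer); higher score is better.
    beam = [(0.0, (), tuple(supertags))]
    while any(buf or len(stack) > 1 for _, stack, buf in beam):
        candidates = []
        for score, stack, buf in beam:
            if not buf and len(stack) == 1:        # finished parse: carry it forward
                candidates.append((score, stack, buf))
                continue
            if buf:                                # SHIFT the next supertag
                new_stack, new_buf = stack + (buf[0],), buf[1:]
                candidates.append((score + score_action(new_stack, new_buf, "shift"),
                                   new_stack, new_buf))
            if len(stack) >= 2:                    # REDUCE: combine the top two categories
                cat = combine(stack[-2], stack[-1])
                if cat is not None:
                    new_stack = stack[:-2] + (cat,)
                    candidates.append((score + score_action(new_stack, buf, "reduce"),
                                       new_stack, buf))
        if not candidates:
            break
        beam = heapq.nlargest(beam_size, candidates, key=lambda c: c[0])
    return beam

With supertags such as ['NP/N', 'N'], the loop shifts both categories and reduces them to a single NP on the stack.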
Expected F-Measure Training for Shift-Reduce Parsing with Recurrent Neural Networks
We present expected F-measure training for shift-reduce parsing with RNNs, which enables the learning of a global parsing model optimized for sentence-level F1. We apply the model to CCG parsing, where it improves over a strong greedy RNN baseline by 1.47% F1, yielding state-of-the-art results for shift-reduce CCG parsing.
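The core of such an objective can be written down compactly: the sentence-level F1 of each candidate in the beam is weighted by its model probability. The sketch below uses a softmax over beam scores and sets of dependencies as the unit of evaluation; both choices are assumptions for illustration and may differ from the paper's exact formulation.

# Sketch of a sentence-level expected-F1 objective over a beam of candidate parses.
# Normalizing beam scores with a softmax is an assumption made for illustration.
import math

def f1(predicted, gold):
    # Sentence-level F1 over sets of labelled dependencies.
    if not predicted or not gold:
        return 0.0
    correct = len(predicted & gold)
    if correct == 0:
        return 0.0
    precision = correct / len(predicted)
    recall = correct / len(gold)
    return 2 * precision * recall / (precision + recall)

def expected_f1(beam, gold):
    # beam: list of (model_score, dependency_set) pairs.
    # Returns sum_d p(d) * F1(d, gold), with p a softmax over the beam scores;
    # training maximizes this expectation instead of a per-action likelihood.
    max_score = max(score for score, _ in beam)
    weights = [math.exp(score - max_score) for score, _ in beam]
    total = sum(weights)
    return sum((w / total) * f1(deps, gold)
               for w, (_, deps) in zip(weights, beam))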
Shift-Reduce CCG Parsing with a Dependency Model
This paper presents the first dependency model for a shift-reduce CCG parser. Modelling dependencies is desirable for a number of reasons, including handling the “spurious” ambiguity of CCG; fitting well with the theory of CCG; and optimizing for structures which are evaluated at test time. We develop a novel training technique using a dependency oracle, in which all derivations are hidden. A c...
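One way to read the oracle idea: since many CCG derivations yield the same dependencies, a derivation can be judged by the dependencies it produces rather than by its tree shape. The helpers below are a hypothetical illustration of such a check, not the training procedure from the paper.

# Hypothetical dependency-oracle check: a (possibly partial) derivation is
# oracle-correct if every dependency it has produced appears in the gold set.
def oracle_correct(derivation_deps, gold_deps):
    # derivation_deps, gold_deps: sets of (head, dependent, label) tuples.
    return derivation_deps <= gold_deps

def split_by_oracle(beam, gold_deps):
    # Partition a beam of (score, derivation_deps) candidates into those the
    # oracle still allows and those it rules out.
    allowed = [item for item in beam if oracle_correct(item[1], gold_deps)]
    ruled_out = [item for item in beam if not oracle_correct(item[1], gold_deps)]
    return allowed, ruled_out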
Neural Shift-Reduce CCG Semantic Parsing
We present a shift-reduce CCG semantic parser. Our parser uses a neural network architecture that balances model capacity and computational cost. We train by transferring a model from a computationally expensive log-linear CKY parser. Our learner addresses two challenges: selecting the best parse for learning when the CKY parser generates multiple correct trees, and learning from partial derivat...